Optimization Equivalence of Divergences Improves Neighbor Embedding (Supplemental Document)
Authors
Abstract
http://archive.ics.uci.edu/ml/
http://yann.lecun.com/exdb/mnist/
http://vlado.fmf.uni-lj.si/pub/networks/data/
http://code.google.com/p/linloglayout/
http://www.music-ir.org/mirex/wiki/2007
Chen et al. (2009). It is a similarity graph of 3090 songs. The songs are evenly divided among 10 classes that roughly correspond to different music genres. The weighted edges are human judgments of how similar two songs are.
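A weighted similarity graph of this kind is the sort of input that neighbor embedding methods consume after normalization. Below is a minimal NumPy sketch of that preprocessing step, assuming the graph is given as a dense weighted adjacency matrix; the function name, the symmetrization choice, and the toy data are illustrative assumptions, not part of the supplemental material.

```python
import numpy as np

def graph_to_probabilities(W):
    """Convert nonnegative edge weights into symmetrized, normalized
    pairwise probabilities of the kind neighbor-embedding methods use."""
    W = np.array(W, dtype=float)
    np.fill_diagonal(W, 0.0)                 # no self-similarity
    row_sums = W.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0.0] = 1.0          # isolated nodes keep an all-zero row
    P_cond = W / row_sums                    # conditional probabilities p(j|i)
    n = W.shape[0]
    return (P_cond + P_cond.T) / (2.0 * n)   # symmetrized joint probabilities

# Toy example: a 4-song graph with made-up similarity weights
W = np.array([[0., 3., 1., 0.],
              [3., 0., 2., 0.],
              [1., 2., 0., 4.],
              [0., 0., 4., 0.]])
P = graph_to_probabilities(W)
print(P.sum())  # 1.0 up to rounding: a joint distribution over song pairs
```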
Similar resources
Optimization Equivalence of Divergences Improves Neighbor Embedding
Visualization methods that arrange data objects in 2D or 3D layouts have followed two main schools: methods oriented toward graph layout and methods oriented toward vectorial embedding. We show that the two previously separate approaches are tied by an optimization equivalence, making it possible to relate methods from the two approaches and to build new methods that take the best of both worlds. In detai...
Stochastic neighbor embedding (SNE) for dimension reduction and visualization using arbitrary divergences
We present a systematic approach to the mathematical treatment of the t-distributed stochastic neighbor embedding (t-SNE) and the stochastic neighbor embedding (SNE) method. This allows an easy adaptation of the methods or exchange of their respective modules. In particular, the divergence which measures the difference between probability distributions in the original and the embedding space ca...
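As a concrete illustration of the modularity this abstract describes, here is a minimal NumPy sketch in which the divergence comparing input and embedding similarities is simply a function argument; the heavy-tailed (t-SNE-style) embedding similarities and all names are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def kl_divergence(P, Q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between pairwise-probability matrices."""
    return float(np.sum(P * (np.log(P + eps) - np.log(Q + eps))))

def embedding_similarities(Y):
    """t-SNE-style heavy-tailed similarities q_ij from low-dimensional coordinates Y."""
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    num = 1.0 / (1.0 + d2)
    np.fill_diagonal(num, 0.0)
    return num / num.sum()

def embedding_cost(P, Y, divergence=kl_divergence):
    """Generic neighbor-embedding cost: any divergence between the fixed input
    similarities P and the similarities induced by the current layout Y."""
    return divergence(P, embedding_similarities(Y))
```

Swapping in a different divergence function changes the optimization target without touching the rest of the pipeline, which is the kind of module exchange the excerpt refers to.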
Type 1 and 2 mixtures of Kullback-Leibler divergences as cost functions in dimensionality reduction based on similarity preservation
Stochastic neighbor embedding (SNE) and its variants are methods of dimensionality reduction (DR) that involve normalized softmax similarities derived from pairwise distances. These methods try to reproduce in the low-dimensional embedding space the similarities observed in the high-dimensional data space. Their outstanding experimental results, compared to previous state-of-the-art methods, or...
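The normalized softmax similarities mentioned here can be written down compactly. The sketch below uses a single fixed bandwidth for all points, whereas SNE-type methods normally calibrate one bandwidth per point from a target perplexity, so treat it as a simplified illustration only.

```python
import numpy as np

def softmax_similarities(X, sigma=1.0):
    """Row-wise softmax similarities p(j|i) from squared Euclidean distances.

    X     : (n, d) data matrix.
    sigma : one fixed bandwidth (real SNE calibrates a per-point bandwidth
            from a target perplexity; that step is omitted here).
    """
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    logits = -d2 / (2.0 * sigma ** 2)
    np.fill_diagonal(logits, -np.inf)               # exclude self-similarity
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)
```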
Type 1 and 2 symmetric divergences for stochastic neighbor embedding
Stochastic neighbor embedding (SNE) is a method of dimensionality reduction (DR) that involves softmax similarities measured between all pairs of data points. In order to build a low-dimensional embedding, SNE tries to reproduce the similarities observed in the high-dimensional data space. The capability of softmax similarities to fight the phenomenon of norm concentration has been studied in pr...
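Two common ways to symmetrize the Kullback-Leibler divergence are sketched below for reference; whether they coincide exactly with the "type 1" and "type 2" constructions of the cited paper cannot be read off the excerpt, so the function names and the weighting parameter are only assumptions.

```python
import numpy as np

def kl(P, Q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q)."""
    return float(np.sum(P * (np.log(P + eps) - np.log(Q + eps))))

def mixture_of_divergences(P, Q, lam=0.5):
    """Weighted mixture of the two KL directions (symmetric when lam = 0.5)."""
    return lam * kl(P, Q) + (1.0 - lam) * kl(Q, P)

def divergence_to_mixture(P, Q, lam=0.5):
    """Each distribution compared against their lam-weighted mixture
    (a generalized Jensen-Shannon-style construction)."""
    M = lam * P + (1.0 - lam) * Q
    return lam * kl(P, M) + (1.0 - lam) * kl(Q, M)
```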
Mathematical Foundations of the Self Organized Neighbor Embedding (SONE) for Dimension Reduction and Visualization
Abstract. In this paper we propose the generalization of the recently introduced Neighbor Embedding Exploratory Observation Machine (NEXOM) for dimension reduction and visualization. We provide a general mathematical framework called Self Organized Neighbor Embedding (SONE). It treats components such as data similarity measures and neighborhood functions as independent and easily exchangeable. ...
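To make the idea of independent, exchangeable components concrete, here is a minimal sketch of such a modular interface in NumPy; the function names and the particular Gaussian neighborhood are illustrative assumptions, not SONE's actual API.

```python
import numpy as np

def squared_euclidean(X):
    """One possible pairwise dissimilarity measure."""
    return np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)

def gaussian_neighborhood(d2, sigma=1.0):
    """One possible neighborhood function over squared dissimilarities."""
    return np.exp(-d2 / (2.0 * sigma ** 2))

def build_affinities(X, dissimilarity=squared_euclidean,
                     neighborhood=gaussian_neighborhood):
    """Compose independently chosen components into a normalized affinity matrix."""
    A = neighborhood(dissimilarity(X))
    np.fill_diagonal(A, 0.0)
    return A / A.sum()
```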